
    Impacts of distraction on driving: An analysis of physical, cognitive, and emotional distraction

    Traditionally, driver distraction has been categorized into four types: visual, biomechanical, auditory, and cognitive. However, the place of emotion in distracted driving research remains undefined. This research investigates the influence of emotional distraction on driving performance. In total, seventy-eight participants were recruited and placed into one of four conditions: physical (visual-biomechanical), cognitive (cognitive-auditory), emotional (anger), and control. The results demonstrated that emotional distraction degrades driving performance as much as or more than the other distraction types. The causes of these results, the underlying mechanisms, and other considerations are addressed in the discussion section.

    Towards an in-vehicle sonically-enhanced gesture control interface: A pilot study

    A pilot study was conducted to explore the potential of sonically-enhanced gestures as controls for future in-vehicle information systems (IVIS). Four concept menu systems were developed using a LEAP Motion and Pure Data: (1) 2x2 with auditory feedback, (2) 2x2 without auditory feedback, (3) 4x4 with auditory feedback, and (4) 4x4 without auditory feedback. Seven participants drove in a simulator while completing simple target-acquisition tasks using each of the four prototype systems. Driving performance and eye glance behavior were collected, as well as subjective ratings of workload and system preference. Results from driving performance and eye tracking measures strongly indicate that the 2x2 grids yield better driving safety outcomes than the 4x4 grids. Subjective ratings show similar patterns for driver workload and preference. Auditory feedback led to similar improvements in driving performance and eye glance behavior, as well as in subjective ratings of workload and preference, compared to visual-only feedback.
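
    The abstract does not include implementation details; as a rough illustration only, the minimal Python sketch below shows one way hand coordinates from a tracker such as the LEAP Motion could be quantized into a 2x2 or 4x4 menu grid, with an auditory cue triggered when the hand enters a new cell. The function names (grid_cell, play_earcon) and the normalized-coordinate interface are assumptions for illustration, not the authors' implementation.

        def grid_cell(x, y, rows, cols):
            """Map normalized hand coordinates (0.0-1.0) to a menu cell index."""
            col = min(int(x * cols), cols - 1)
            row = min(int(y * rows), rows - 1)
            return row * cols + col

        def play_earcon(cell):
            """Stand-in for an auditory cue, e.g. a Pure Data patch triggered externally."""
            print(f"earcon for cell {cell}")

        last_cell = None
        for x, y in [(0.1, 0.2), (0.8, 0.2), (0.8, 0.9)]:    # example hand samples
            cell = grid_cell(x, y, rows=2, cols=2)           # rows=cols=4 for the 4x4 menu
            if cell != last_cell:                            # announce only on cell change
                play_earcon(cell)
                last_cell = cell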

    Cultural differences in preference of auditory emoticons: USA and South Korea

    For the last two decades, research on auditory displays and sonification has continuously increased. However, most research has focused on cognitive and functional mapping rather than emotional mapping. Moreover, there has not been much research on cultural differences in auditory displays. The present study compared user preference for auditory emoticons in two countries: the USA and South Korea. Seventy students evaluated 112 auditory icons and 115 earcons on 30 emotional adjectives. Results indicated that participants showed similar preferences within the same category (auditory icons or earcons), but different patterns when asked to select the best sound between the two categories. Implications for cultural differences in preference and directions for future design and research of auditory emoticons are discussed.

    Influences of Visual and Auditory Displays on Aimed Movements Using Air Gesture Controls

    With the proliferation of technologies operated via in-air hand movements, e.g., virtual/augmented reality, in-vehicle infotainment systems, and large public information displays, there remains an open question about whether and how auditory displays can be used effectively to facilitate eyes-free aimed movements. We conducted a within-subjects study, similar to a Fitts paradigm study, in which 24 participants completed simple aimed movements to acquire targets of varying sizes and distances. Participants completed these aimed movements for six conditions, each presenting a unique combination of visual and auditory displays. Results showed participants were generally faster to make selections when using visual displays compared to displays without visuals. However, selection accuracy for auditory-only displays was similar to that of displays with visual components. These results highlight the potential for auditory displays to aid aimed movements using air gestures in conditions where visual displays are impractical, impossible, or unhelpful.
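
    The abstract does not state which Fitts-type difficulty metric was used; for reference, a widely used (Shannon) formulation of the index of difficulty and the associated movement-time model is, in LaTeX notation:

        ID = \log_2\left(\frac{D}{W} + 1\right), \qquad MT = a + b \cdot ID

    where D is the movement distance to the target, W is the target width, and a and b are constants fitted empirically for a given input technique.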

    Design and Evaluation of Auditory-Supported Air Gesture Controls in Vehicles

    The number of visual distraction-caused crashes highlights a need for non-visual information displays in vehicles. Auditory-supported air gesture controls could fill that need. This dissertation covers four experiments that aim to explore the design of an auditory-supported air gesture system and examine its real-world influence on driving performance. The first three experiments compared different prototype gesture control designs as participants used the systems in a driving simulator. The fourth experiment sought to answer more basic questions about how auditory displays influence performance in target acquisition tasks. Results from Experiment 1 offered optimism for the potential of auditory-supported displays for navigating simple menus by showing a decrease in off-road glance time compared to visual-only displays. Experiment 1 also showed a need to keep menu items small in number but large in size. Results from Experiment 2 showed that auditory-supported air gesture controls can result in safer driving performance relative to touchscreens, but at the cost of slight decrements in menu task performance. Results from Experiment 3 showed that drivers can navigate through simple menu structures totally eyes-free, with no visual displays, even with less effort compared to visual displays and visual plus auditory displays. Experiment 4 showed that auditory displays convey information and allow for accurate target selection, but result in slower and relatively less accurate selections compared to displays with visual information, especially for more difficult target selections. Overall, the experimental data highlight the potential of auditory-supported air gesture controls for increasing eyes-on-road time relative to visual displays, both in touchscreens and air gesture controls. However, this benefit came at a slight cost to target selection performance, as participants generally took longer to process auditory information in simple target acquisition tasks. Experimental results are discussed in the context of multiple resource theory and Fitts's law. Design guidelines and future work are also discussed.

    Impacts of anger on driving performance: A comparison to texting and conversation while driving

    Traditionally, driver distraction has been categorized into four types: visual, biomechanical, auditory, and cognitive. However, the place of emotion in driving research is largely undefined. The present study investigates the specific influence of anger, a representative emotion arising while driving, on driving performance, compared to that of traditional distraction tasks. In total, seventy-eight participants were recruited and placed into one of four driving conditions: physical (visual-biomechanical) distraction, cognitive (cognitive-auditory) distraction, emotional (anger), and control. The results demonstrated that anger degrades driving performance as much as or more than the other distraction types, specifically in a yellow traffic signal situation. The causes of these results, the underlying mechanisms, and other considerations are discussed with implications for future research.

    Eyes-free in-vehicle gesture controls: auditory-only displays reduced visual distraction and workload

    Visual distractions increase crash risk while driving. Our research focuses on creating and evaluating an air gesture control system that is less visually demanding than current infotainment systems. We completed a within-subjects experiment with 24 participants, each of whom completed a simulated drive while using six different prototypes in turn. The primary research questions concerned the influence of display combinations (visual, visual/auditory, auditory) and control orientation (vertical vs. horizontal). We recorded lane departures, eye glance behavior, secondary task performance, and driver workload. Results demonstrated that all prototypes performed comparably on lane departures, with the auditory-only display showing a strong tendency toward improvement. A deeper look illustrated a tradeoff between eyes-on-road time and secondary task completion time for the auditory-only display, which was the safest but slowest of the six prototypes. The auditory-only display also reduced overall workload. Control orientation showed only a small subjective effect in favor of vertical controls.

    Design and evaluation of auditory-supported air gesture controls in vehicles

    Using touchscreens while driving introduces competition for visual attention that increases crash risk. To resolve this issue, we have developed an auditory-supported air gesture system. We conducted two experiments using a driving simulator to investigate the influence of this system on driving performance, eye glance behavior, secondary task performance, and driver workload. In Experiment 1 we investigated the impact of menu layout and auditory displays with 23 participants. In Experiment 2 we compared the best systems from Experiment 1 with equivalent touchscreen systems with 24 participants. Results from Experiment 1 showed that menus arranged in 2 × 2 grids outperformed systems with 4 × 4 grids across all measures, and also demonstrated that auditory displays can be used to reduce the visual demands of in-vehicle controls. In Experiment 2, auditory-supported air gestures allowed drivers to look at the road more, showed equivalent driver workload and driving performance, and slightly decreased secondary task performance compared to touchscreens. Implications are discussed in the context of multiple resource theory and Fitts's law.

    In-vehicle air gesture design: impacts of display modality and control orientation

    The number of visual distraction-caused crashes highlights a need for non-visual displays in the in-vehicle information system (IVIS). Auditory-supported air gesture controls can tackle this problem. Twenty-four young drivers participated in our experiment using a driving simulator with six different gesture prototypes: 3 display modalities (visual-only, visual/auditory, and auditory-only) × 2 control orientations (horizontal and vertical). Various data were obtained, including lane departures, eye glance behavior, secondary task performance, and driver workload. Results showed that the auditory-only displays yielded significantly fewer lane departures and lower perceived workload. A tradeoff between eyes-on-road time and secondary task completion time was also observed for the auditory-only display, making it the safest but slowest among the prototypes. Vertical controls (direct manipulation) showed significantly lower workload than horizontal controls (mouse metaphor), but did not differ on performance measures. Experimental results are discussed in the context of multiple resource theory, along with design guidelines for future implementation.
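
    The abstract does not detail how the two control orientations were implemented; the following is a minimal Python sketch, under assumptions, of how a 3-D hand position might be mapped to a 2-D display cursor for each orientation. The normalized-coordinate interface and the function name cursor_from_hand are hypothetical.

        def cursor_from_hand(hand_x, hand_y, hand_z, orientation):
            """Map a normalized 3-D hand position (0.0-1.0 per axis) to 2-D display coordinates.

            vertical   : the hand moves in a plane parallel to the display
                         (direct manipulation), so x and y map straight through.
            horizontal : the hand moves over a desk-like plane that is remapped
                         to the display (mouse metaphor), so depth (z) maps to
                         the display's vertical axis.
            """
            if orientation == "vertical":
                return hand_x, hand_y
            if orientation == "horizontal":
                return hand_x, hand_z
            raise ValueError(f"unknown orientation: {orientation}")

        print(cursor_from_hand(0.3, 0.7, 0.5, "vertical"))    # -> (0.3, 0.7)
        print(cursor_from_hand(0.3, 0.7, 0.5, "horizontal"))  # -> (0.3, 0.5)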

    Development tool for rapid evaluation of eyes-free in-vehicle gesture controls

    In-vehicle controls, such as navigation systems, radio dials, and climate controls, can be visually demanding and can increase crash risk. We are attempting to use commercially available gesture detection equipment to develop an eyes-free system that provides access to the same controls without increasing crash risk. We envision a system that can be controlled by intuitive in-air gestures. Information about system status and gesture detection can be provided through an auditory menu display, rather than the visual modality used in touchscreen controls. Here, we describe our motivation for developing the system and present a tool we have developed to help researchers, even those without programming experience, configure multiple system designs for rapid usability evaluation and iteration.
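
    The abstract does not specify the tool's configuration format; as a purely hypothetical illustration, a prototype description that a non-programmer could edit, and that such a tool might load, could look like the following Python snippet. All keys and values shown are assumptions, not the actual tool's schema.

        import json

        # Hypothetical description of one prototype to be evaluated; the real
        # tool's schema is not described in the abstract.
        prototype = {
            "name": "2x2_auditory_only",
            "menu": {"rows": 2, "cols": 2,
                     "items": ["radio", "climate", "navigation", "phone"]},
            "display": {"visual": False, "auditory": True},
            "auditory_feedback": {"on_enter_cell": "earcon", "on_select": "speech"},
            "control_orientation": "vertical",
        }

        # A configuration like this could be saved to disk and swapped out between
        # usability sessions without touching code.
        print(json.dumps(prototype, indent=2))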